From:                              route@monster.com

Sent:                               Tuesday, June 04, 2013 3:54 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Big Data

 

This resume has been forwarded to you at the request of Monster User xapeix01

Confidential Resume

Last updated:  06/03/13

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


 

 

RESUME

  

Resume Headline: Big Data

Resume Value: u9pzvxze9g4tdfwu   

  

 

             

bigdatasolution@gmail.com                            1-201-484-8695

Successful evangelist with an extensive background in the hands-on leadership and delivery of technology-enabled initiatives.

17+ years of IT experience leading programs, strategic implementations and application development teams, with experience in data warehousing and data marts, data integration, data mining, business intelligence and statistical analytics. A hands-on practitioner in the field; built my own JVM (Java Virtual Machine).

4 years of experience in the design and implementation of large systems with data volumes on the order of 35-50 terabytes. Built the Enterprise Solutions Architecture, not just the Data Architecture.

Proven experience in data engineering and data analytics. Functions span data architecture, data modeling (conceptual, logical and physical), data mining, data integration and optimization, data stores, data analytics and data visualization, using both traditional and big data approaches.

 

Authored Hadoop papers which can be shared on request:

1)       Best practices in Hadoop

2)       Compression in Hadoop and cluster rebalancing

Analyze anything using data: Thanksgiving data, eating a burger in Manhattan.

Currently authoring "Turning Data to Dollars in the Value Chain": unlocking potential through data insights.

 

Certified PMP working with top-tier clients: defining requirements; modeling solutions using key technologies (J2EE, EAI, finance models, columnar DBs, SOA, latency, Hadoop, UML, RUP, messaging, ERP, CEP, etc.).

Led teams and worked across the US, Europe, Singapore, Switzerland, Spain, the UK and India.

Core competencies: Strategy planning and execution, Executive team leadership, New program execution strategy, Captive and offshore CoEs, Best practices, Process improvement, Change management.

Regular speaker at industry conferences on Hadoop, HFT, latency monitoring, trading systems, cloud computing, building a single multi-asset-class trading strategy, etc. Speaker at the 2012 Hadoop conference; will be speaking at the Hadoop Conference in September; spoke at the NA Financial Information Summit on May 21st.

 

USP: Accomplishments at various levels: a combination of Program Head, Data Scientist / Quantitative Analyst, Operational Analytics, and Business Intelligence/Discovery.

Program-managed teams for the design and development of Business Intelligence solutions, ETL transformations, data modeling and data warehousing.

Strong background in the definition, execution and management of all aspects of software development (SDLC): project management, requirements definition, business and system modeling, design, application development, change control, testing, training and configuration management.

 

              PROFESSIONAL EXPERIENCE

US Consulting Organization              2012-present

Current Client: Morgan Stanley, Technical Program Manager / Lead Solution Architect

Lead multiple big data initiatives with finance heads, business managers and program managers: technology evaluation and selection, proof-of-concept development and implementations, including base models for MS globally.

·         Led innovation by exploring, investigating, recommending, benchmarking and implementing data-centric technologies for big data. Mapped big data management technologies: Hadoop, HDFS, MapReduce and related areas.

·         Spearhead various solution teams in architecting, designing and implementing solutions using data collection, Hadoop, messaging, algorithms, analytics and visualization technologies.

·         Commerce Model: Pioneered the entire program from discovery to decision-making, delivering operational insights with minimal latency via visualization. Led the conceptual, logical and physical data modeling.

·         This program is a convergence of big data, data discovery, business intelligence and analytics.

·         The Commerce Model is designed to implement a common trade and asset representation across all asset classes and functions. It includes end-to-end trade capture through risk management to the subledger as a “Single Source of Truth”.

·         Led the design of the columnar datastores Olympus and Optimus. Now building a correlation DB with algorithms defined, not just a columnar DB. Led the POC of 9 columnar databases, including Greenplum, Vertica, Cassandra, Teradata, etc.

·         Spearheaded the MapReduce functionality and the sequencing/chaining of MapReduce jobs. Defined the MR functionality for job trackers and task trackers. Used HiveQL for queries and Pig for MapReduce scripting.

·         Set up Sqoop mappings for bi-directional transfer between the current trade and risk systems and Hadoop. The entire workflow was designed for the new model (conceptual, logical and analytical), and Oozie was set up for Hadoop workflow branching. Championed the FUSE implementation, which let us use standard filesystem commands. Synchronization via the messaging API / pub-subs was kept separate, and Hadoop cluster syncing used a ZooKeeper configuration.

·         Spearhead the data integration process using Platfora and Pentaho.

·         The end goal: each market model, algo and market instance gives different pricing and validation results. So we analyze the last year of data, run the algos and find the best results using R on Hadoop, including risk. This should happen with minimal latency and help improve trade profits by giving the right signals to trade.

·         Lead the requirements gathering, business flow, and system and design modeling using UML, RSA and RUP.

·         Designed the functional architecture and data strategy using a columnar database: a bitemporal, versioned database. Revamped the Enterprise Reference Data as CRD, SRD and ERD, while using HBase for real-time read/write.

·         Spearhead the development of the Pioneer Prime message bus for front-to-back flows, risk consolidation and the DSL system, using Google protocol buffers with the object API and wire API.

·         Orchestrated GRAIN for risk consolidation. Managed the requirements, architecture and design of RICE pricing and valuation engine for different asset classes and federal regulation.

Systems currently using the Commerce Model include Rates Derivatives, FX Options, Credit, Equity Derivatives, Equity Swaps and Equity Cash, demonstrating the success of this global program.

 

Big data processing and analytics for MSWIM: Led the planning and implementation for MSWIM. This was demoed at the expo, and new applications are being built on this underlying framework for wealth management, fraud analysis, client upsell and cross-sell, etc. The application provides new insights for growing revenue, reducing risk and improving operational efficiency. More can be discussed in person.

·         System components included the Hadoop pipeline, integration engine, analytical foundation (including a statistical library) and visualization engine, which will be expanded across MSWIM. Chukwa was used for log collection and processing.

 

US Consulting Organization

Client: Citi, Sr. Technical Program Manager               Jan 2012-Aug 2012     

CFPS: Customer Fraud Prevention System

·         Drove all elements of the end-to-end technology platform required to support the business (including operations, finance, corporate risk and underlying infrastructure).

·         The program detects abnormal activity and warns of fraud against customer accounts, broken down by account type and location.

·         It drills down to account level details, usual spend patterns, customer behavior profile, key risk contributing events.

·         Led the database architecture and GUI, data aggregation, Hadoop cluster, data implementation, data storage and data distribution. Built the entire solution using Hadoop components, connectors, a messaging framework, various algos, MapReduce functionality, analytics and a visualization tool.

 

US Consulting Organization              2008-2011

Client: Bank Of America, Program Manager and Lead Solution Architect

Outstanding relationship builder: managed business expectations and gained the trust and respect of managing directors and strategic partners. Maintained quality standards and best business practices, and led high-impact change initiatives.

·         Responsible for program assessment, recommendations, suggestions, areas of improvement, audit and assurance.

·         Led a significant number of operations system migrations/implementations as part of a larger transformation and change strategy.

·         Spearheaded the framework, architecture and standards for a new data warehouse platform using Agile and Scrum methodology.

Insights for valuation: Managed the program, with a team of quants and developers, building an investment recommendation platform able to recommend investment opportunities based on ideas from different sources (equity/bond research, Twitter data, Finviz, TheStreet recommendations, portfolio diagnostics, etc.). A major milestone was reported by the Wall Street Journal. Used Tableau for filtering, heat maps, Pareto charts, cascading filters and correlations.

 

Data Clustering for Model Analysis (DaCMA): Spearheaded the engagement for cluster analysis with GPU acceleration on a q/kdb+ columnar DB.

·         Daily data volumes were 18M+ stock trades, 2B+ option ticks and 550M+ ticks across exchanges. Historical data was stored in kdb+ and retrieved through q as column vectors for feeding into GPUs. Deployed CUDA in C++ for the GPUs, which enhanced parallel computing performance.

 

Credit Suisse, Vice President              2007-2008

Served as the tactical leader building the technology roadmap and ensuring global and local implementations of programs.

·         Responsible for the implementation of risk management and real-time market data systems. Built the BIG Reference Data Model and reconciliation across asset classes.

·         Led the implementation of Molan, a mortgage and loan analysis system. The QC Audit application replaced the existing loan auditing process, in which an Excel spreadsheet was used to audit, read and verify all loan deficiencies and other information.

Mortgage credit risk model: Championed, designed and developed mortgage loan-level credit risk models. Main responsibilities included data preparation, data analysis, risk factor design/extraction, risk profiling and implementation. The risk models were deployed and incorporated into the in-house automated underwriting and loan origination systems. This encompassed survival analysis, PD, LGD and EAD.

·         Spearheaded project management, including risk management and contingency planning, dashboards, pricing, resource management, managing offshore teams and vendors, proposals, client management, brainstorming sessions, IT governance and compliance, best practices and guidelines, and meeting targets.

·         Created new methodologies, reusable components and blueprints, best practices and delivery alliances for specific deliveries. Worked seamlessly with counterparts in other regions to provide solutions to the global business.

             

UBS, Sr. Project Manager and Lead Technical Architect              2005-2007

·         Built the future-state trading system (functional design and architecture) for UBS. This combined the Equity and Fixed Income systems with risk management, trade execution, and trade capture and settlement, including OBS transactions.

·         Responsibilities included development and implementation of real-time risk and pricing models, maintenance and development of analytical calculation framework.

·         Built the EDW, which integrated data handled by multiple businesses into a single point where headquarters could view overall performance and risk reports. The EDW is sourced from multiple businesses and systems, modeled using FSLDM in third normal form, with an access layer built on top as a star schema for downstream reporting.

·         Systematized precise simulation models and used a diverse array of methods and special analyses to meet evolving client needs. Analyzed time trends in financial performance indicators, quantified results and recommended program improvements regarding risk assessment and workload administration.

 

Unisys Corporation, Vice President              2002-2005

              Worked at different levels for a mutual fund, a hedge fund and a reputed investment bank.

·         Built the Trade/Product Lifecycle system for trade capture, pricing, booking and structuring.

·         Transformed the operations of a major bank, leading a team of 70+ employees and client resources across organizational change, business process re-engineering and technology delivery using an in-built DW.

·         Responsible for the specification and implementation of large trading, risk management and real-time market data systems and trading simulations, automation of various strategies.

·         Automated a real-time order flow execution network. Implemented best execution and facilitation for derivatives.

 

              Andersen Consulting, Lead Technical Architect/PM - Advanced Financial Technology Group               2000-2002

·         BSCH: Trading Solution Architecture developed for the largest bank in Spain.

Implemented the TSA (Trading Solution Architecture), which integrated all business processes. Built an ODS.

·         Front-office implementation of investment planning, advisory and portfolio management for Fidelity.

See http://netbenefits.fidelity.com. Project POTOM (Portfolio and Trade Order Management System).

 

Citigroup              1994- 2000

Grew from developer to technical architect to technical PM.

 

EDUCATION

·         MBA - Finance from IIM (Ahmedabad). Ranked as the toughest management school in the world to get admitted to.

·         MS in Financial Engineering from Columbia University

·         CA - 15th All-India rank

·         CFA - Medals in various groups

·         BE, DDE - Electronics and Telecom Engineering (6th and 2nd in the university, respectively)

 

ADDITIONAL INFORMATION

·         Certificate Course in Derivatives

·         Certificate in Futures and Options Program

·         ASF Securitization Institute - Applied Securitization and Modeling Certification

·         National Maths Olympiad Champion

 

              Prominent Certifications include:

PMP from PMI

UML 1.4, 2.0

Calypso 9.3, 10.0

Rational - RUP, Rose, XDE

Java Certified Senior Architect, Java Web Component Developer

EAI - Tibco, MQ Series (Websphere MQ)

ERP - SAP

Weblogic Developer Certification

Six Sigma Green belt

 

Technical Skills Summary

Data Collection and Storage, Analytics and Financial Software

Hadoop and components such as Hive, HBase, Oozie, Pig, ZooKeeper, Sqoop, Chukwa, etc.

Cloud Computing- Amazon EC2, Elastic MapReduce, Hadoop Cluster

Analytics: time series, OLS, random forests, SVM, linear and non-linear regression, k-means clustering, splines and MARS, Lasso and Ridge regression, etc.

Financial Software: Front Arena, TSA(Trading Solutions Architecture), OMS, EMS, DMA, Intex, multi-threading, Numerix, SciComp, Quantifi, TZero, Algorithmic trading, HFT and CEP-Coral 8, Apama, tick data-Vhayu, FinCad, Fidessa, Pat Systems, Models for Fixed income, rates, forex, equities and derivatives

Columnar Databases and visualization tools

Vertica, Greenplum, Splunk, Cassandra, kdb+, R, Revolution R, Tableau, Pentaho, Bigvis, Acunu, Platfora, Oracle, Sybase, Access, SQL, MySQL, eSQL

Messaging

Tibco, MQ Series, Webmethods, SeeBeyond, Java API

Market Data

Reuters, Bloomberg, Telekurs, Markit etc.

Architecture /services

TOGAF, Zachman, EA, SOA, CostXpert, webservices, SCA, grid computing, SaaS, Cloud computing, virtualization

Application servers

Websphere, Weblogic, Jetty, Apache, Eclipse, JWS, iPlanet, JBoss, NetBeans

ERP

SAP

Development

C++, J2EE, JDBC, JPA, JDO, RMI, C, C#, Unix with kernel tuning, Ajax, Servlets, Struts, Swing, Hibernate, LDAP, CORBA, .NET framework, XSLT, VBA, Excel, JMS

OS

Unix, Linux, Windows

Business Modeling

Provision, ARIS, Bwise, Fuego, Rational, RUP, RSA, UML, TogetherSoft, Requisite Pro, ClearCase, ErWin, Toad, Visio

Languages

K, R, Java, C++, VB, XML, C#, UML, C, SQL, WML, MATLAB

Team management

Onsite and Offshore team management with PMP certification

Process modeling

UML, BPM, Rational Rose, Rational Suite, Visio, TogetherJ, RSA, RSM

Methodologies

Agile, Scrum, Extreme Programming, RUP, CMMi, Waterfall, OOM, SOA, SDLC, UML

 

 

             



Experience

Job Title:  Big data

Company:  CONFIDENTIAL

Experience:  - Present

 

Additional Info


 

Current Career Level:

Manager (Manager/Supervisor of Staff)

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Big data

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US
US-NY-New York City

Relocate:

No